A geometric theory for preconditioned inverse iteration. III: A short and sharp convergence estimate for generalized eigenvalue problems
Authors
Abstract
In two previous papers by Neymeyr, A geometric theory for preconditioned inverse iteration I: Extrema of the Rayleigh quotient, LAA 322 (1-3): 61-85, 2001, and A geometric theory for preconditioned inverse iteration II: Convergence estimates, LAA 322 (1-3): 87-104, 2001, a sharp but cumbersome convergence rate estimate was proved for a simple preconditioned eigensolver that computes the smallest eigenvalue together with the corresponding eigenvector of a symmetric positive definite matrix, using a preconditioned gradient minimization of the Rayleigh quotient. In the present paper, we discover and prove a much shorter and more elegant convergence rate estimate for the same method, still sharp in the decisive quantities, that also holds for a generalized symmetric definite eigenvalue problem. The new estimate is simple enough to stimulate the search for a more straightforward proof technique that could help investigate such a practically important method as the locally optimal block preconditioned conjugate gradient (LOBPCG) eigensolver. We demonstrate the practical effectiveness of the latter for a model problem, where it compares favorably with two well-known Jacobi-Davidson type methods, JDQR and JDCG.
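To make the iteration concrete, the following is a minimal Python sketch of a preconditioned gradient step with a fixed (unit) step size for the generalized symmetric definite eigenproblem Ax = λBx; the function name pinvit, the Jacobi-type preconditioner, and the stopping test are illustrative assumptions, not taken from the papers.

    import numpy as np

    def pinvit(A, B, T, x0, tol=1e-8, maxit=200):
        # Preconditioned gradient minimization of the Rayleigh quotient
        # lambda(x) = (x'Ax)/(x'Bx); T(r) applies a preconditioner
        # approximating the inverse of A to the residual r.
        x = x0 / np.sqrt(x0 @ (B @ x0))            # B-normalize the initial guess
        lam = (x @ (A @ x)) / (x @ (B @ x))
        for _ in range(maxit):
            r = A @ x - lam * (B @ x)              # eigenvalue residual
            if np.linalg.norm(r) <= tol * np.linalg.norm(A @ x):
                break
            x = x - T(r)                           # preconditioned gradient step
            x = x / np.sqrt(x @ (B @ x))           # keep the iterate B-normalized
            lam = (x @ (A @ x)) / (x @ (B @ x))    # updated Rayleigh quotient
        return lam, x

    # Toy usage: a diagonal pencil with a Jacobi (diagonal) preconditioner.
    n = 100
    A = np.diag(np.arange(1.0, n + 1.0))
    B = np.eye(n)
    T = lambda r: r / np.diag(A)
    lam, x = pinvit(A, B, T, np.random.rand(n))

With the ideal preconditioner T = A^{-1}, the step reduces to scaled inverse iteration, which is the geometric connection exploited in the convergence analysis.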
Similar Articles
A New Inexact Inverse Subspace Iteration for Generalized Eigenvalue Problems
In this paper, we present an inexact inverse subspace iteration method for computing a few eigenpairs of the generalized eigenvalue problem Ax = λBx [Q. Ye and P. Zhang, Inexact inverse subspace iteration for generalized eigenvalue problems, Linear Algebra and its Applications, 434 (2011) 1697-1715]. In particular, the linear convergence property of the inverse subspace iteration is preserved.
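As a rough illustration of the method class referred to here (details differ from the cited paper), inexact inverse subspace iteration replaces the exact block solve A Y = B X_k by an approximate one and then re-orthonormalizes; the hypothetical sketch below uses a fixed, small number of CG steps as the inexact inner solver and assumes A is symmetric positive definite.

    import numpy as np
    from scipy.linalg import eigh
    from scipy.sparse.linalg import cg

    def inexact_inverse_subspace_iteration(A, B, X0, inner_steps=10, outer_steps=30):
        # Each outer step solves A Y = B X only approximately (a few CG
        # iterations per column), then orthonormalizes Y to form the next basis.
        X, _ = np.linalg.qr(X0)
        for _ in range(outer_steps):
            Y = np.empty_like(X)
            for j in range(X.shape[1]):
                Y[:, j], _ = cg(A, B @ X[:, j], maxiter=inner_steps)  # inexact solve
            X, _ = np.linalg.qr(Y)
        # Rayleigh-Ritz on span(X) yields approximations to the smallest eigenvalues.
        ritz = eigh(X.T @ (A @ X), X.T @ (B @ X), eigvals_only=True)
        return ritz, X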
A Geometric Theory for Preconditioned Inverse Iteration II: Convergence Estimates
The topic of this paper is a convergence analysis of preconditioned inverse iteration (PINVIT). A sharp estimate for the eigenvalue approximations is derived; the eigenvector approximations are controlled by an upper bound for the residual vector. The analysis is mainly based on extremal properties of various quantities which define the geometry of PINVIT.
A Geometric Convergence Theory for the Preconditioned Steepest Descent Iteration
Preconditioned gradient iterations for very large eigenvalue problems are efficient solvers with growing popularity. However, only for the simplest preconditioned eigensolver, namely the preconditioned gradient iteration (or preconditioned inverse iteration) with fixed step size, sharp non-asymptotic convergence estimates are known. These estimates require a properly scaled preconditioner. In t...
A Geometric Theory for Preconditioned Inverse Iteration I: Extrema of the Rayleigh Quotient
The discretization of eigenvalue problems for partial differential operators is a major source of matrix eigenvalue problems having very large dimensions, but only some of the smallest eigenvalues together with the eigenvectors are to be determined. Preconditioned inverse iteration (a “matrix factorization–free” method) derives from the well–known inverse iteration procedure in such a way that ...
Gradient Flow Approach to Geometric Convergence Analysis of Preconditioned Eigensolvers
Preconditioned eigenvalue solvers (eigensolvers) are gaining popularity, but their convergence theory remains sparse and complex. We consider the simplest preconditioned eigensolver— the gradient iterative method with a fixed step size—for symmetric generalized eigenvalue problems, where we use the gradient of the Rayleigh quotient as an optimization direction. A sharp convergence rate bound fo...